Commentary: Leaks Renew Concern over Facebook’s Fact-Checking Sway

[Photo: Facebook logo with a smartphone showing a lock]
by Kalev Leetaru

This week the Wall Street Journal unveiled “The Facebook Files” – an investigative series based on leaked internal Facebook materials that offer an unvarnished look at how the social media giant sees its platform and its impact on society. A central theme of the reporting is how sharply Facebook’s own research is at odds with its public statements: internally, the company has recognized the harms its platform causes society even while publicly touting its benefits.

The Journal’s reporting raises myriad concerns about the state of today’s social platforms, from Instagram’s toxic influence on teenage girls to the impact of algorithmic changes on political discourse to how Facebook secretly shields influential users from its content moderation rules.

Given the growing influence of fact-checkers as the ultimate arbiters of “truth” in the digital world, it is notable that the Journal also reported that their verdicts may not be as independent as publicly portrayed: “Facebook has asked fact-checking partners to retroactively change their findings on posts from high-profile accounts.”

Asked by RealClearPolitics how many times it has intervened in fact-checking verdicts and under what circumstances it asks fact-checkers to change their rulings, a Facebook spokesperson did not answer, pointing instead to the company’s generic fact-checking FAQ. Asked whether it would deny on the record ever having ordered a fact-checking partner to change a verdict, the company did not respond.

The International Fact-Checking Network (IFCN), which has established the set of standards to which most major fact-checkers adhere, did not respond to multiple requests for comment regarding whether it was aware of any of its signatories receiving and/or honoring requests from Facebook to change their verdicts.

Asked whether PolitiFact had ever received a request from Facebook to change one of its verdicts, whether it had ever acquiesced, and whether it was aware of such requests to other fact-checkers, its executive director, Aaron Sharockman, responded that it is in the midst of “fact-finding.” Asked whether PolitiFact could at least confirm that it had never received or honored such requests, Sharockman responded that “any comment we have, we’ll make in the manner and time of our choosing.”

For an industry built on trust and transparency, it is remarkable that neither the IFCN nor PolitiFact was forthcoming about these allegations that Facebook has asked fact-checkers to protect powerful causes and people. If true, such interventions would largely undermine and delegitimize their work, since the powerful would be able to secure favorable verdicts for their falsehoods.

Facebook helps fund the fact-checking community: it accounted for more than 5% of PolitiFact’s revenue in 2020 and is one of the top funders of many other fact-checking operations. If fact-checkers are facing pressure to change their verdicts, even if they don’t ultimately honor those requests, such demands could have a chilling effect on their independence. Given fact-checkers’ ability to halt the online distribution of stories and ideas they deem false or misleading, the public has a right to know the degree to which outside forces are shaping their rulings.

In fact, last year the business magazine Fast Company confirmed that fact-checking organizations, including IFCN signatories, have changed their verdicts under pressure from Facebook. In at least one case, internal Facebook correspondence shows that an IFCN signatory changed its verdict from “False” to “Partly False” (which carries fewer penalties) after the social media platform flagged that the publisher being fact-checked was a major advertiser whose spending could be affected by the harsher rating.

Asked to comment on the apparent discrepancy between Facebook’s public portrayal of fact-checkers’ independence and its interventions to change their verdicts, a spokesperson confirmed that the company does intervene when it believes a different rating should have been applied, but would not comment on how often this has occurred.

How many times have fact-checkers changed their verdicts at the request of Facebook or other major funders? Have they ever changed verdicts at the request of influential politicians? We have no idea, and the organizations’ silence on the Journal’s reporting reminds us that the public should not expect transparency when it comes to the operations of fact-checkers or the social platforms they work with, despite their outsized power over the digital public square.

Outside of leaks of internal company documents, such as the 2017 disclosure of Facebook’s internal moderation guidelines to The Guardian or the Journal’s current series, the only real insights we have come from outside researchers probing social platforms’ inner workings. Facebook is increasingly pushing back on such efforts.

Last month, Facebook disabled the accounts of a New York University project that asked volunteers to install a browser plug-in to collect information on how the ads they saw on the platform were targeted to them. While Facebook publishes a database of the ads that run on its platforms, it notably does not disclose whom each ad targeted. For example, the database shows that 76% of the impressions of a Pennsylvania Democratic Party ad about school boards went to women. Was that because Facebook’s algorithms believe women care more about education issues (raising potential algorithmic bias concerns), or did the party explicitly target women (contrary to Democrats’ push against gender stereotypes)? There is simply no way to know.

Asked why it does not publish this information for political ads, the company would only confirm that it has no plans to do so. One possibility, however, is that a closer look at how politicians target their ads would reveal uncomfortable truths about how they see America and potentially expose racial and gender stereotyping at odds with their public commitments.

In the end, why does all of this matter? In Facebook’s own words, it matters because even a single algorithmic tweak can influence policymaking around the world, forcing political parties “to skew negative in their communications on Facebook, with the downstream effect of leading them into more extreme policy positions.” One Polish political party’s posts shifted from 50% positive to 80% negative solely because of Facebook’s algorithm change prioritizing divisive content. As Facebook ultimately summarized, many political parties, “including those that have shifted strongly to the negative, worry about the long-term effects on democracy.”

– – –

RealClear Media Fellow Kalev Leetaru is a senior fellow at the George Washington University Center for Cyber & Homeland Security. His past roles include fellow in residence at Georgetown University’s Edmund A. Walsh School of Foreign Service and member of the World Economic Forum’s Global Agenda Council on the Future of Government.
Photo “Facebook” by Thoughtcatalog CC BY 2.0.

Content created by RealClearInvestigations is available without charge to any eligible news publisher. For republishing terms, please contact nsanchez@realclearfoundation.com.
